Patent abstract:
Method for automatic exposure target adjustment, method for generating a saturated pixel mask, and imaging apparatus. The present invention relates to a method that sets an autoexposure target in an autoexposure operation on an image sequence, such as an infrared image sequence. The method comprises: obtaining a histogram of at least one of the images; applying a weighted histogram table to the histogram to obtain weighted histogram boxes, where at least some boxes in the histogram that contain saturated pixels are assigned a higher weighting value and at least some boxes in the histogram that contain unsaturated pixels are assigned a lower weighting value; summing the weighted histogram boxes to obtain a saturation point; and lowering an autoexposure target for an autoexposure operation when the saturation point exceeds a first threshold value, and increasing the autoexposure target when the saturation point is below the first threshold value and the image is underexposed.
Publication number: BR112015016525B1
Application number: R112015016525-7
Filing date: 2013-01-28
Publication date: 2021-09-14
Inventors: Sina Afrooze; Guoqian Sun
Applicant: Avigilon Corporation
IPC main classification:
Patent description:

FIELD
[0001] The present invention relates generally to a scene-adaptive automatic exposure compensation method and to an imaging apparatus operable to perform this method.
BACKGROUND
[0002] It is often desirable to use an automatic exposure algorithm in a camera to ensure that the image is obtained with a balance between shutter time and gain under different lighting conditions. Often the goal of the autoexposure algorithm is to maintain a constant total luminance for the image by adjusting the shutter time and gain. The problem with this approach is that if the scene contains objects that are very small (<1% of image pixels) but much brighter than the rest of the image, those objects will be overexposed and will appear saturated in the image. This problem is particularly serious for a camera equipped with an infrared ("IR") illuminator, as some small objects of interest (e.g., license plates) have a much higher IR reflectivity than the rest of the scene. However, if the camera ensures that no areas of the image are saturated, a small bright light in the image can cause most of the image to be permanently underexposed.
[0003] There are also several cameras on the market that try to solve the problem of saturated objects resulting from IR illumination by simply reducing the intensity of the IR illuminator. Controlling IR intensity to control saturation will only work for a camera with a built-in IR illuminator that can be directly controlled. In particular, this approach will not work with cameras that use an external IR illuminator, where the illumination intensity cannot be directly controlled.
SUMMARY
[0004] According to an aspect of the invention, a method is provided for setting an automatic exposure target in an automatic exposure operation on an image sequence, such as an infrared image sequence. The method comprises obtaining a histogram of an image; applying a weighted histogram table to the histogram to obtain weighted tonal value boxes, whereby at least some boxes in the histogram that contain saturated pixels are assigned a higher weight value and at least some boxes in the histogram that contain unsaturated pixels are assigned a lower weight value; summing the weighted histogram boxes to obtain a saturation point; decreasing an auto exposure target for an auto exposure operation when the saturation point indicates the image is oversaturated; and increasing the auto exposure target when the saturation point indicates the image is unsaturated and the image is underexposed.
[0005] Before obtaining an image histogram, the saturated pixels of static or mostly static backgrounds in the image can be masked by identifying pixels in the image that have exceeded a saturation threshold a selected number of times during a period of time, and applying a mask to the identified pixels so that the masked pixels are not included in the histogram. The step of identifying pixels as static or mostly static background pixels may comprise reading a saturation level of a pixel, incrementing a counter when the pixel exceeds the saturation threshold, and identifying the pixel as a static or mostly static background saturated pixel when the counter exceeds a background saturated pixel threshold value. The method may further comprise unmasking masked pixels by reading a saturation level of a masked pixel, decrementing a counter when the masked pixel is below the saturation threshold, and removing the mask from the masked pixel when the counter is below a background saturated pixel threshold value.
[0006] The histogram can be obtained from only one of each selected number of images in the image sequence, the selected number of images being greater than the number of images in the image sequence to which the automatic exposure operation is applied. The weighted histogram table can have a weight value that increases linearly from the boxes assigned the lowest weight value to the boxes assigned the highest weight value, with the linear increase occurring within a selected range of boxes. The largest weight value can be 1 and the smallest weight value can be 0.
[0007] The method may further comprise decreasing the auto exposure target by a first amount when the image is highly oversaturated, and decreasing the auto exposure target by a second amount when the image is slightly oversaturated, where the saturation point when the image is highly oversaturated is higher than the saturation point when the image is slightly oversaturated. The first amount can be a function of the original auto exposure target, and the second amount can be a function of the average total exposure level of the image. The images can be infrared images.
[0008] According to another aspect of the invention, an imaging apparatus is provided that comprises: an imager; and a processing circuit in communication with the imager for receiving an image captured by the imager. The processing circuit comprises a processor and a memory that has encoded therein program code executable by the processor to carry out the aforementioned method.
[0009] According to another aspect of the invention, a method is provided to generate a mask of saturated background objects in an image, comprising: identifying pixels in the image that exceed a saturation threshold a selected number of times during a period of time as static or mostly static background saturated pixels; and assigning a mask indicator to each pixel in the image identified as a static or mostly static background saturated pixel. The method may further comprise identifying masked pixels in the image that are below a saturation threshold a selected number of times over a period of time as unsaturated pixels, and removing the mask indicator from the masked pixels that are unsaturated.
[00010] The method may further comprise two saturation thresholds, particularly an upper saturation threshold above which a pixel is considered saturated and a lower saturation threshold below which a pixel is considered unsaturated. The step of identifying pixels as static or mostly static background pixels may comprise reading a saturation level of a pixel, incrementing a counter when the pixel exceeds the saturation threshold, and identifying the pixel as a static or mostly static background saturated pixel when the counter exceeds a background saturated pixel threshold value. The step of identifying masked pixels that are unsaturated may comprise reading a saturation level of a masked pixel, decrementing a counter when the masked pixel is below the saturation threshold, and identifying the masked pixel as unsaturated when the counter is below a background saturated pixel threshold value.
BRIEF DESCRIPTION OF THE DRAWINGS
[00011] Figures 1A, 1B and 1C are schematic block diagrams of three embodiments of an imaging apparatus, in which a first embodiment comprises an imaging apparatus having an IR illuminator with a pair of infrared emitting diode (IRED) arrays, a second embodiment comprises an imaging apparatus that has an IR illuminator with a single IRED array, and a third embodiment comprises an imaging apparatus that uses an external IR illuminator to provide IR illumination. Each embodiment also comprises a memory that has, encoded therein, an executable program for performing a scene-adaptive automatic exposure compensation method for an automatic exposure operation.
[00012] Figure 2 is a perspective view of the imaging apparatus incorporated as a security camera.
[00013] Figure 3 is an exemplary weighted histogram table to be applied to a histogram of an image captured by the imaging apparatus to perform the scene-adaptive automatic exposure compensation method.
[00014] Figure 4 is a flowchart showing the steps of a first component of the scene-adaptive automatic exposure compensation method, particularly the steps to generate a background saturation mask.
[00015] Figures 5A and 5B are a flowchart showing the steps of a second component of the scene-adaptive automatic exposure compensation method, particularly the steps to control the saturation level in an automatic exposure operation.
DETAILED DESCRIPTION
[00016] The embodiments described in this document provide a scene-adaptive automatic exposure compensation method used in an automatic exposure operation of an imaging apparatus such as a security camera. The scene-adaptive automatic exposure compensation method comprises two components: a first component that masks the saturated objects in the background of the image (the "background saturation mask generator"), and a second component that provides a compensation value for different parts of an image histogram when calculating an autoexposure target value for use in an image-averaging autoexposure operation (the "autoexposure complement to control the saturation level"). The first component causes the auto exposure operation to ignore background-saturated objects, which would otherwise cause the auto exposure operation to underexpose objects of interest in the image. The second component controls the saturation levels in the image by reducing the auto exposure target value of the auto exposure operation when certain parts of the image are oversaturated, and returning the auto exposure target value to the original auto exposure target value when no area in the image is oversaturated or the oversaturated areas are masked by the background saturation mask.
[00017] In other words, the method embodied by the present embodiments controls the exposure (shutter time and gain) to avoid saturation in an IR-illuminated scene, instead of avoiding saturation by controlling the intensity of the IR illuminator, as in conventional approaches. This allows the method to be used on imaging devices that have built-in or external IR illuminators, since IR illumination intensity control is not used to control saturation levels. Furthermore, the method, which avoids, or at least reduces, image saturation and ignores saturated background objects, is expected to increase image quality and, in particular, to preserve image detail for objects of interest in the image. An additional benefit of avoiding or reducing saturation through exposure control is the ability to reduce motion blur for moving objects in the image by reducing the image exposure.
[00018] The method is embodied as program code stored in a memory of an imaging apparatus and executed by a processor in that apparatus. The imaging apparatus may feature an integrated IR illuminator, such as those shown in Figures 1A and 1B, or use an external IR illuminator, such as that shown in Figure 1C.
[00019] Referring now to Figure 1A, an imaging apparatus 10 according to one embodiment comprises the following main components: an adjustable focus lens 12; an imager 14 optically coupled to the adjustable focus lens 12; a lens driver 16 mechanically coupled to the adjustable focus lens 12 and operable to change the focal length of the adjustable focus lens; an IR illuminator 18 comprising a pair of IR emitters 18(a), 18(b), each producing an IR illumination beam with a different linear profile (respectively, the "wide-angle IR emitter" 18(a) and the "narrow-angle IR emitter" 18(b)); a current driver 20(a), 20(b) for each IR emitter 18(a), 18(b); and a processing and control circuit 22 in communication with the imager 14, the lens driver 16 and the current drivers 20(a), 20(b).
[00020] The imaging apparatus 10 can be embodied as a security camera like the one shown in Figure 2. The security camera 10 has a housing 30, which houses the aforementioned main components of the imaging apparatus, and a movable assembly 32 for mounting the camera 10 on a surface such as a ceiling. The adjustable focus lens 12 is mounted on the front of the camera 10, and a printed circuit board ("PCB", not shown) is also mounted on the front of the camera 10 around the adjustable focus lens 12; the wide-angle IR emitter 18(a) and the narrow-angle IR emitter 18(b) are mounted on this PCB, face in the same direction as the adjustable focus lens 12, and serve to illuminate the field of view of the adjustable focus lens with infrared light.
[00021] Each IR emitter 18(a), 18(b), in this embodiment, comprises an array of infrared emitting diodes (IRED) 34. Such IRED arrays are known in the art; a suitable IRED array comprises a pair of Osram SFH4715S IREDs. Each IR emitter 18(a), 18(b) also comprises a microlens 36 for each IRED 34, the microlens 36 being configured to shape the IRED emission into an IR beam having a particular illumination pattern and a particular linear profile. In particular, the microlenses 36 for the wide-angle IR emitter 18(a) produce an IR beam with a linear profile that is relatively widely dispersed (hereafter referred to as the "wide beam component"), and the microlenses 36 for the narrow-angle IR emitter 18(b) produce an IR beam with a linear profile that is relatively narrowly dispersed (hereafter referred to as the "narrow beam component"). Such microlenses are known in the art; a suitable microlens can be supplied by Ledil.
[00022] The current drivers 20(a), 20(b) are designed to regulate the current delivered to the IR emitters 18(a), 18(b). The current drivers 20(a), 20(b) can be controlled to deliver the total available energy entirely to one or the other of the IR emitters 18(a), 18(b), or to vary the proportion of energy between the two emitters 18(a), 18(b). Such current drivers are known in the art; a suitable current driver is the On Semiconductor AL8805 Buck LED Driver. The current drivers 20(a), 20(b) are communicatively coupled to a general purpose input/output (GPIO) pin 38 on a circuit board within the housing that contains the processing circuit 22 (also known as the main system-on-chip (SoC)) of the security camera 10. The processing circuit 22 comprises an interface bus with pins 42, 44 that are communicatively coupled to the lens driver 16 and the imager 14. The imager 14 is configured to capture light in the infrared spectrum, and may be, for example, a digital sensor such as a complementary metal oxide semiconductor (CMOS) sensor. The specifications of the imager 14 and the adjustable focus lens 12 can be selected according to an operator's performance requirements and expectations. The operation of adjustable focus lenses and imaging sensors in a security camera is well known in the art, and therefore the operation of the imager 14, the lens driver 16 and the adjustable focus lens 12 will not be described in greater detail here.
[00023] The processing circuit 22 also includes a processor and a memory (CPU) 40 that has, encoded therein, program code that is executed by the processor to operate the security camera 10. This program code includes instructions to send a control signal from the GPIO pin 38 to the current drivers 20(a), 20(b) to produce the IR beam. The program code may also include instructions to combine the wide beam component and the narrow beam component at one or more different energy ratios to produce a combined IR beam that has different linear profiles. As will be described in more detail below, the program code also includes instructions for performing an automatic exposure operation and the scene-adaptive automatic exposure compensation method that can be used in the automatic exposure operation.
[00024] Referring now to Figure 1B, the second embodiment of the imaging apparatus 10 is the same as the first embodiment except that this imaging apparatus has only a single IRED array, to produce an IR illumination beam with a fixed amplitude (i.e., a non-variable linear profile) rather than a variable-amplitude IR illumination beam (i.e., a variable linear profile). Like the first embodiment, this second embodiment also comprises the processing circuit 22 having a memory encoded with program code that includes instructions for performing an automatic exposure operation and the scene-adaptive automatic exposure compensation method that can be used in the automatic exposure operation.
[00025] Referring now to Figure 1C, the third embodiment of the imaging apparatus 10 is the same as the first and second embodiments, except that this imaging apparatus does not have a built-in IR illuminator, but uses an external IR illuminator to illuminate the scene. Like the first and second embodiments, this third embodiment also comprises the processing circuit 22 that has a memory encoded with program code that includes instructions for performing an automatic exposure operation and the scene-adaptive automatic exposure compensation method that can be used in the automatic exposure operation.
SCENE-ADAPTIVE AUTOMATIC EXPOSURE COMPENSATION METHOD
[00026] With reference now to Figures 3, 4, 5A and 5B, the scene-adaptive automatic exposure compensation method will be described in detail. As noted above, the method comprises two components: (1) a scene-adaptive background saturation mask generator; and (2) an automatic exposure complement to control the saturation level. The first component creates a mask that identifies regions of the image that are saturated but can be considered static or mostly static and part of the background; this mask can be used by the second component to ignore the saturated regions of the background when controlling the image's saturation level. The second component keeps the image saturation level below a threshold level (usually 0) by processing an image histogram using a weighted histogram table, and setting new auto exposure targets for the auto exposure operation when the processed histogram values do not fall within a target threshold range.
[00027] The automatic exposure operation can be based on a conventional image-averaging automatic exposure algorithm that adjusts the exposure of a series of images (frames) taken over a period of time. Such conventional autoexposure algorithms are well known in the art, and therefore the autoexposure operation is not described in further detail here.
[00028] With particular reference to Figure 4, the scene-adaptive background saturation mask comprises a series of instructions executed by the processor to identify saturated pixels in the image that are static or mostly static and thus can be considered part of the image background (assuming that, at least for security surveillance purposes, objects of interest will be moving in the image). Static and mostly static objects that are found to be saturated are masked, that is, ignored by the automatic exposure complement that controls the saturation level of the image. This ensures that the exposure is not set too low if the background of the image contains small bright objects such as street lights.
[00029] The background saturation mask component of the method works by processing each pixel i of each of the captured images ("image[i]") on a pixel-by-pixel basis (from i = 0 to NumberOfPixels). Starting with the first pixel i = 0 (step 100), the saturation level of each pixel i is read (step 102); assuming the saturation level is represented by one byte of data, there can be 256 measurable saturation levels, from 0 to 255.
[00030] The saturation level read from each pixel i is compared to an upper saturation threshold T1, above which the pixel is considered saturated (step 104), and a lower saturation threshold T0, below which the pixel is considered unsaturated (step 106). The values of these thresholds depend on the type of sensor used; common values can be, for example, T1 = 200 and T0 = 190 for a saturation range that has 256 saturation levels.
[00031] Alternatively, though not shown in Figure 4, a single saturation threshold T can be provided, above which the pixel is considered saturated and below which the pixel is considered unsaturated.
[00032] To determine whether the pixel represents a static or mostly static saturated object, a counter ("counter[i]") is provided, which is incremented by an increment U when the saturation level reading exceeds the upper saturation threshold T1 (step 108), and decremented by a decrement D when the saturation level reading is below the lower saturation threshold T0 (step 110). A clipping function is applied to keep the counter value between 0 and a selected maximum ("maxCount") (step 112); when the counter falls below 0, counter[i] is set to 0, and when the counter exceeds maxCount, counter[i] is set to maxCount. If the saturation level reading equals or lies between the upper and lower saturation thresholds T1, T0, the counter is not changed.
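For illustration, a minimal Python sketch of this per-pixel counter update (steps 104 to 112) is shown below; the threshold, increment and ceiling values are example or assumed values, not fixed by the patent.

```python
# Minimal sketch of the per-pixel counter update (steps 104-112).
# T1/T0 are the upper/lower saturation thresholds, U/D the increment and
# decrement, MAX_COUNT the clipping ceiling ("maxCount"); values are examples.
T1, T0 = 200, 190
U, D = 2, 1
MAX_COUNT = 255  # assumed ceiling; the text leaves maxCount to the implementer

def update_counter(counter: int, saturation_level: int) -> int:
    """Return the updated counter[i] for one pixel's saturation reading."""
    if saturation_level > T1:      # step 108: reading above T1 -> add U
        counter += U
    elif saturation_level < T0:    # step 110: reading below T0 -> subtract D
        counter -= D
    # step 112: clip the counter to the range [0, MAX_COUNT]
    return max(0, min(counter, MAX_COUNT))
```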
[00033] The counter value counter[i] is then compared to a background saturated pixel threshold value M that represents a threshold above which pixel i is considered static or mostly static and saturated (step 114). When the counter value counter[i] is above this threshold, the pixel i in question is added to the mask by associating a mask indicator with that pixel and storing that indicator in memory (step 116); if that pixel i already has a mask indicator (previously determined by the method for a previous image), then no action is taken, and the mask indicator remains associated with that pixel i.
[00034] When the counter value counter[i] is below the background saturated pixel threshold value M, the pixel in question is removed from the mask by deleting the mask indicator associated with that pixel (step 118), if such an indicator exists. If there is no mask indicator associated with that pixel, then no action is taken.
[00035] The background saturation mask component of the method is now complete for the pixel in question, and the method advances to the next pixel in the image (step 120), and steps 102 to 118 are repeated (step 122). Once all pixels in the image have been processed, the method advances to the next image in the time sequence, and the method is performed again on a pixel-by-pixel basis.
[00036] It can be seen that the time constant for determining when a pixel is considered a background saturated pixel can be adjusted by tuning any or all of the background saturated pixel threshold value M and the increment and decrement values U, D. Adjusting these values essentially adjusts the time period (number of image frames) used to determine when a pixel is considered static enough to be treated as part of the background. For example, increasing the threshold M will cause the method to take longer (process more image frames) for the counter value to reach the threshold M. Conversely, increasing the increment U will decrease the amount of time it takes the counter to reach the threshold M. These M, U, D values can be adjusted by trial and error to calibrate the method so that it accurately determines whether a bright object is static (each repetition of the method will advance the counter by the increment U) or moving (one or more iterations of the method will increment the counter while the bright object is within the pixel's field of view, and decrement the counter once the bright object has moved outside the pixel's field of view).
[00037] It can also be seen that the method can be calibrated to capture "mostly static" objects, such as flashing lights, as background saturated objects, by adjusting the increments U and D accordingly. For example, U can be assigned a higher value than D, in which case the counter will slowly build up over a number of image frames until counter[i] exceeds the threshold M. For example, U can be assigned a value of 2 and D a value of 1.
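Putting steps 100 to 122 together, the following Python sketch updates the counters and the background saturation mask for one whole frame; it assumes 8-bit grayscale frames held in NumPy arrays, and the constant values and function name are illustrative rather than taken from the patent.

```python
import numpy as np

T1, T0 = 200, 190      # upper / lower saturation thresholds (example values)
U, D = 2, 1            # counter increment / decrement (example values)
MAX_COUNT = 255        # counter clipping ceiling ("maxCount"), assumed
M = 100                # background saturated pixel threshold value, assumed

def update_background_mask(frame, counters, mask):
    """One pass of the background saturation mask generator for one frame."""
    counters[frame > T1] += U                      # step 108: above T1 -> add U
    counters[frame < T0] -= D                      # step 110: below T0 -> subtract D
    np.clip(counters, 0, MAX_COUNT, out=counters)  # step 112: clip to [0, maxCount]
    mask[counters > M] = True                      # step 116: add pixel to mask
    mask[counters < M] = False                     # step 118: remove pixel from mask
    return counters, mask

# Usage: run once for every captured frame, reusing counters and mask.
height, width = 480, 640
counters = np.zeros((height, width), dtype=np.int32)
mask = np.zeros((height, width), dtype=bool)
frame = np.random.randint(0, 256, (height, width), dtype=np.uint8)  # stand-in image
counters, mask = update_background_mask(frame, counters, mask)
```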
[00038] Referring now to Figures 3, 5A and 5B, the autoexposure complement to control the saturation level ("autoexposure complement component") involves creating a histogram of an image after saturated objects have been masked by the background saturation mask generator, then applying a weighted histogram table to the histogram to obtain a "saturation point", which can be considered the degree to which the image is saturated, and which is used to determine whether new autoexposure target values should be used by the autoexposure operation when the image is found to be oversaturated, or unsaturated but underexposed. It is expected that when the auto exposure operation uses the auto exposure complement of the present method, the image saturation level can be kept below a threshold value, thus improving image quality, especially in the parts of the image that are of interest.
[00039] The autoexposure complement component of the method is initiated by setting a current autoexposure target value ("aeTrgt") equal to the original autoexposure target associated with the autoexposure operation ("origAeTrgt") (step 200). The original auto exposure target value can be user-set or predetermined in the auto exposure operation program code. Then, a series of steps (steps 204 to 208) is performed to run the autoexposure complement component on only one out of every selected number K of image frames, in order to ensure that the cycle of the autoexposure complement component and the cycle of the automatic exposure operation do not interact to cause image instability. For example, K can be four, in which case a counter ("imgCount") is employed to count each image frame (step 206) and ensure that the automatic exposure complement is performed on every fourth image frame in the sequence (step 208).
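As a small illustration of this frame-count gating (steps 204 to 208), the sketch below runs the complement on only every K-th frame; the counter handling is an assumption about one simple way to implement the described behaviour.

```python
K = 4            # run the autoexposure complement on every K-th frame (example)
img_count = 0    # "imgCount" in the text

def should_run_complement() -> bool:
    """Count frames (step 206) and return True only on every K-th frame (step 208)."""
    global img_count
    img_count = (img_count + 1) % K
    return img_count == 0
```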
[00040] The background saturation mask generator is applied to images, including the K-th image to which the auto exposure complement component is applied (step 210), in the manner described above. Then, an image histogram of the masked image is obtained in a manner that is well known in the art (step 212). An image histogram is a type of histogram that acts as a graphical representation of the tonal distribution in a digital image: it plots the number of pixels for each tonal value box within a range of tonal values.
[00041] Then, a weighted histogram table like the one shown in Figure 3 is applied to the histogram to obtain the saturation point of the image (step 214). The weighted histogram table assigns a weight value WV to each box in the histogram; for example, for an image histogram that has a one-byte tonal range, there will be 256 separate boxes (bin[0] to bin[255]), each representing a different tonal value. The weighted histogram table assigns a weight value of 0 to all tonal value boxes that are considered unsaturated, from the lowest tonal value (bin[0]) to a first selected tonal value k (bin[k-1]), and assigns a weight value of 1 to all tonal value boxes that are considered oversaturated, from a second selected tonal value m (bin[m+1]) to the highest tonal value (bin[255]). The weight value increases linearly from 0 to 1, from bin[k] to bin[m], to provide a smooth transition during exposure changes. The first and second tonal values k, m can be selected based on what the operator would consider an acceptable amount of saturation in the image, and could, for example, be 200 and 230 respectively. The saturation point ("SaturationScore") is the sum, over all boxes, of the product of each histogram box value ("Histogram[i]") and its corresponding weight value ("WeightingTable[i]"):
SaturationScore = Σ (for i = 0 to 255) Histogram[i] × WeightingTable[i]
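A short Python sketch of step 214 is given below: it builds a weighting table with the zero / linear-ramp / one structure of Figure 3 and computes the saturation point as the weighted sum above. The values k = 200 and m = 230 are the example values from the text, and NumPy is assumed for convenience.

```python
import numpy as np

def build_weighting_table(k: int = 200, m: int = 230, bins: int = 256):
    """Weight 0 up to bin[k-1], a linear 0->1 ramp from bin[k] to bin[m], 1 above."""
    table = np.zeros(bins)
    table[m + 1:] = 1.0
    table[k:m + 1] = np.linspace(0.0, 1.0, m - k + 1)  # smooth transition region
    return table

def saturation_score(histogram, weighting_table) -> float:
    """SaturationScore = sum over all boxes of Histogram[i] * WeightingTable[i]."""
    return float(np.dot(histogram, weighting_table))

# Usage with a 256-box histogram of the masked image:
hist = np.bincount(np.random.randint(0, 256, 480 * 640), minlength=256)  # stand-in
score = saturation_score(hist, build_weighting_table())
```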
[00042] Once the saturation point has been determined for the image, the saturation point is compared to an upper limit value maxScore (step 216) and a lower limit value minScore (step 218). When the saturation point is greater than the upper limit value maxScore, the image is considered highly oversaturated, and when the saturation point is below the lower limit value minScore, the image is considered unsaturated. When the saturation point is between maxScore and minScore, the image is considered slightly oversaturated. As will be discussed below, steps are taken to decrease the auto exposure target when the image is considered oversaturated, to reduce the degree of saturation in the image, and to increase the auto exposure target when the image is considered unsaturated and underexposed, to improve the exposure in the image without making the image oversaturated.
[00043] The maxScore and minScore values are determined empirically and will depend on the imager used and on operator preferences. A typical value for maxScore might be 20 pixels and a typical value for minScore might be 0.
[00044] When the saturation point is above maxScore, the current auto exposure target aeTrgt is reduced by a constant high factor L from its original value origAeTrgt (step 220), in order to cause the exposure to be reduced relatively quickly:
aeTrgt = origAeTrgt * L
[00045] The constant high factor L is determined empirically and may depend on operator preferences. A typical value for L might be 0.1. The auto exposure target for the auto exposure operation (AutoExposureTarget) is then set to the current auto exposure target aeTrgt (step 232), the auto exposure operation is performed to adjust the exposure to the new auto exposure target, and the method returns to step 202.
[00046] When the saturation point is between minScore and maxScore, the current auto exposure target aeTrgt is set from the current average total exposure level of the image (GlobalImageMean), scaled by a constant low factor S, in order to cause the exposure to decrease relatively slowly (step 222):
aeTrgt = aeMean * S
[00047] where aeMean = GlobalImageMean (step 221).
[00048] The constant low factor S has a higher value than the constant high factor L and can be a factor that is linearly interpolated between the values S1 and S2 based on the saturation point, determined by:
S = S1 + (S2 - S1) * (saturationScore - minScore) / (maxScore - minScore)
[00049] where S1 and S2 represent the degree of change in the automatic exposure target. S1 and S2 are determined empirically and may depend on user preferences. S1 is larger than S2, and they can be, for example, 0.9 and 0.5 respectively. The auto exposure target for the auto exposure operation (AutoExposureTarget) is then set to the current auto exposure target aeTrgt (step 232), the auto exposure operation is performed to adjust the exposure to the new auto exposure target, and the method returns to step 202.
[00050] When the saturation point is less than minScore, the image is considered unsaturated, and the auto exposure target aeTrgt is increased if the image is underexposed; underexposed, in this context, means that the pixels that have the highest tonal values in the image have an average tonal value that is below a tonal value threshold above which a pixel is considered oversaturated. Alternatively, underexposed can mean that the pixel that has the highest tonal value in the image is below the threshold tonal value. Provided the current autoexposure target aeTrgt is smaller than the original autoexposure target (step 224), a series of steps is taken to increase the autoexposure target if the image is actually underexposed. In order to determine whether the image is underexposed, the average tonal value hiMean of a selected number n of the brightest pixels in the histogram is determined (step 226). In other words, the n pixels in the image with the highest tonal values are identified, and their tonal values are averaged. The value of n is determined empirically; a typical value for n might be 10. Then, a constant maxMean value is selected, which represents the tonal value threshold above which a pixel is considered oversaturated; this value can be determined empirically and will, in part, depend on the imaging sensor used to capture the image. A typical value for maxMean is a value that is less than or equal to the second tonal value m used in the weighted histogram table, which in this case is 230.
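The average tonal value of the n brightest pixels (hiMean, step 226) can be read directly off the histogram, as in the short sketch below; n = 10 follows the example value in the text, and the helper name is illustrative.

```python
def hi_mean(histogram, n: int = 10) -> float:
    """Average tonal value of the n pixels with the highest tonal values."""
    taken, weighted_sum = 0, 0.0
    for tone in range(len(histogram) - 1, -1, -1):   # walk from the brightest box down
        take = min(int(histogram[tone]), n - taken)  # take at most n pixels in total
        weighted_sum += take * tone
        taken += take
        if taken >= n:
            break
    return weighted_sum / max(taken, 1)
```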
[00051] Since the saturation point is less than minScore, hiMean should be close to maxMean. The auto exposure target is then set to the image's current average total exposure level (aeMean = GlobalImageMean) (step 225), scaled by the ratio of maxMean to hiMean (step 228):
aeTrgt = aeMean * (maxMean / hiMean)
[00052] Therefore, when hiMean is less than maxMean, the ratio is greater than 1 and the auto exposure target will be set to a value that is greater than the current average total exposure level of the image. It can be seen that when hiMean equals maxMean, the image is not considered underexposed, and thus the auto exposure target is set to the average total exposure level of the image.
[00053] The auto exposure target must not grow above the original auto exposure target; if the scaled result above for the autoexposure target aeTrgt is higher than the original autoexposure target origAeTrgt, then the autoexposure target is set equal to the original autoexposure target (step 230):
aeTrgt = MIN(origAeTrgt, aeTrgt)
[00054] Finally, the auto exposure target for the auto exposure operation (AutoExposureTarget) is set to the current auto exposure target aeTrgt, the auto exposure operation is executed, and the method returns to step 202.
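The target-selection logic of steps 216 to 232 can be summarised by the following simplified Python sketch. The constants are the example values quoted above (maxScore = 20, minScore = 0, L = 0.1, S1 = 0.9, S2 = 0.5, maxMean = 230), the interpolation of S follows the linear form between S1 and S2 described in paragraph [00048], and the guard against the original target from step 224 is folded into the final clamp.

```python
MAX_SCORE, MIN_SCORE = 20, 0   # upper / lower saturation point limits (examples)
L = 0.1                        # constant high factor
S1, S2 = 0.9, 0.5              # bounds of the interpolated constant low factor S
MAX_MEAN = 230                 # tonal value above which a pixel is oversaturated

def new_auto_exposure_target(score, orig_target, global_mean, hi_mean_value):
    """Return the next AutoExposureTarget from the saturation point and image stats."""
    if score > MAX_SCORE:                 # highly oversaturated: reduce quickly
        return orig_target * L            # step 220: aeTrgt = origAeTrgt * L
    if score > MIN_SCORE:                 # slightly oversaturated: reduce slowly
        s = S1 + (S2 - S1) * (score - MIN_SCORE) / (MAX_SCORE - MIN_SCORE)
        return global_mean * s            # steps 221-222: aeTrgt = aeMean * S
    # unsaturated: raise the target only as far as the underexposure allows
    target = global_mean * (MAX_MEAN / max(hi_mean_value, 1))   # steps 225-228
    return min(orig_target, target)       # step 230: never exceed origAeTrgt
```

In operation, the returned value becomes AutoExposureTarget (step 232), and the conventional image-averaging autoexposure operation then drives shutter time and gain toward it.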
[00055] Although the present invention has been described herein by way of preferred embodiments, it will be understood by those skilled in the art that various changes and additions can be made to the invention. Such changes and alternatives are considered to be within the spirit and scope of the present invention.
Claims:
Claims (12)
[0001]
1. Method for adjusting an autoexposure target in an autoexposure operation on a sequence of images, characterized in that it comprises: (a) before obtaining a histogram of an image, masking static or mostly static background saturated pixels in the image by identifying pixels in the image that have exceeded a saturation threshold a selected number of times during a period of time, and applying (210) a mask to the identified pixels such that the masked pixels are not included in the histogram; (b) obtaining (212) a histogram of an image; (c) applying (214) a weighted histogram table to the histogram to obtain weighted tonal value boxes, wherein at least some boxes in the histogram that contain saturated pixels are assigned a higher weight value and at least some boxes in the histogram that contain unsaturated pixels are assigned a lower weight value; and (d) summing the weighted histogram boxes to obtain a saturation point and, when the saturation point indicates that the image is oversaturated, decreasing a target value for the autoexposure in connection with an autoexposure operation, and increasing the target value when the saturation point indicates that the image is unsaturated (224) and when the image is underexposed.
[0002]
2. Method according to claim 1, characterized in that the step of identifying pixels as static or mostly static background pixels comprises reading a saturation level of a pixel, incrementing (108) a counter when the pixel exceeds the saturation threshold, and identifying the pixel as a static or mostly static background saturated pixel when the counter exceeds a background saturated pixel threshold value.
[0003]
3. Method according to claim 1, characterized in that it further comprises unmasking masked pixels by reading a saturation level of a masked pixel, decrementing (110) a counter when the masked pixel is below the saturation threshold, and removing the mask from the masked pixel when the counter is below a background saturated pixel threshold value.
[0004]
4. Method according to claim 1, characterized in that the histogram is obtained from only one among each selected number of images in the image sequence, the selected number of images being greater than the number of images in the image sequence to which the auto exposure operation is applied.
[0005]
5. Method according to claim 4, characterized in that the weighted histogram table has a weight value that increases linearly from the boxes assigned the lowest weight value to the boxes assigned the highest weight value, with the linear increase occurring within a selected range of boxes.
[0006]
6. Method according to claim 5, characterized in that the largest weight value is 1 and the smallest weight value is 0.
[0007]
7. Method according to claim 6, characterized in that it further comprises decreasing the target value by a first amount when the image is highly oversaturated, and decreasing the target value by a second amount when the image is slightly oversaturated, where the saturation point when the image is highly oversaturated is higher than the saturation point when the image is slightly oversaturated.
[0008]
8. Method according to claim 7, characterized in that the first quantity is a function of the target value, and the second quantity is a function of the average total exposure level of the image.
[0009]
9. Method according to any one of claims 1 to 8, characterized in that the images are infrared images.
[0010]
10. Method according to claim 1, characterized in that the image is underexposed when a selected number of pixels with the highest tonal values in the image have an average tonal value that is below a tonal value threshold above which a pixel is considered oversaturated.
[0011]
11. Method according to claim 10, characterized in that the target value is increased to the average total exposure level of the image multiplied by the ratio of the tonal value threshold above which a pixel is considered oversaturated to the average tonal value of the selected number of pixels that have the highest tonal values in the image.
[0012]
12. Imaging apparatus (10) characterized in that it comprises: (a) an imager (14); and (b) a processing circuit (22) in communication with the imager (14) for receiving an image captured by the imager (14), the processing circuit (22) comprising a processor and a memory (40) having encoded therein program code executable by the processor for performing a method for adjusting an automatic exposure target in an automatic exposure operation on a sequence of images, as defined in any one of claims 1 to 11.
Similar technologies:
Publication number | Publication date | Patent title
BR112015016525B1|2021-09-14|METHOD FOR ADJUSTING AUTOMATIC EXPOSURE TARGET, METHOD FOR GENERATING SATURATED PIXEL MASK AND IMAGING APPARATUS
JP5610762B2|2014-10-22|Imaging apparatus and control method
JP4068614B2|2008-03-26|Exposure compensation method and system using metric matrix and flash
JP2004212385A|2004-07-29|Photographic device, photographing method and control method for the photographic device
TWI608735B|2017-12-11|Image capturing device and brightness adjusting method
BR112014027944B1|2022-02-01|Method for generating an infrared beam for illuminating a scene to be imaged and apparatus for illuminating a scene to be imaged with infrared radiation
JP5728498B2|2015-06-03|Imaging apparatus and light emission amount control method thereof
JP2006270218A|2006-10-05|Electronic camera and electronic camera system
JP2009204734A|2009-09-10|Method for adjusting light distribution, illuminator and imaging apparatus
TWI407780B|2013-09-01|Methods for image exposure correction
US10771717B2|2020-09-08|Use of IR pre-flash for RGB camera's automatic algorithms
JP6303304B2|2018-04-04|camera
US20070264000A1|2007-11-15|Image Extraction Apparatus and Flash Control Method for Same
JP2016139959A|2016-08-04|Dimming control apparatus, control method thereof, control program, and imaging apparatus
CN109076199B|2021-06-15|White balance adjustment device, working method thereof and non-transitory computer readable medium
JP2015034850A|2015-02-19|Photographing device and photographing method
JP2007171518A|2007-07-05|Imaging apparatus, method for controlling the same and control program
CN110187591B|2021-09-17|Flash lamp control method and device, electronic equipment and storage medium
JP2019144478A|2019-08-29|Lighting device, imaging system, method for controlling lighting device, and program
KR20110017556A|2011-02-22|Camera module
JP2009276560A|2009-11-26|Imaging apparatus and imaging method
JP6127583B2|2017-05-17|Imaging device
KR101015669B1|2011-02-22|Apparatus and method for controlling intensity ofcamera flash
JP2014232234A|2014-12-11|Exposure control device and optical apparatus
JP2009188847A|2009-08-20|Electronic camera
Patent family:
Publication number | Publication date
EP2946249B1|2018-05-23|
AU2013374190B2|2016-11-10|
EP2946249A1|2015-11-25|
US9615032B2|2017-04-04|
KR101883037B1|2018-07-27|
US9536292B2|2017-01-03|
BR112015016525A2|2017-07-11|
CN105190424B|2018-06-01|
EP2946249A4|2016-11-02|
CN105190424A|2015-12-23|
HK1216192A1|2016-10-21|
IL239679D0|2015-08-31|
MX349155B|2017-07-14|
CA2896825C|2019-09-10|
SG11201505509RA|2015-08-28|
MX2015009082A|2016-03-11|
NZ709731A|2016-12-23|
CA2896825A1|2014-07-24|
JP2016510532A|2016-04-07|
KR20150139826A|2015-12-14|
US20160142608A1|2016-05-19|
US20140198218A1|2014-07-17|
JP5981053B2|2016-08-31|
WO2014110654A1|2014-07-24|
AU2013374190A1|2015-07-23|
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title

US6816200B1|1998-08-31|2004-11-09|Neostar, Inc.|Method and apparatus for detecting camera sensor intensity saturation|
JP3465632B2|1999-06-04|2003-11-10|日本電気株式会社|Object detection device and object detection method|
US20040100565A1|2002-11-22|2004-05-27|Eastman Kodak Company|Method and system for generating images used in extended range panorama composition|
US7327504B2|2002-12-06|2008-02-05|Eastman Kodak Company|Method of detecting clipped image pixels|
FI116246B|2003-06-30|2005-10-14|Nokia Corp|Method and system for adjusting the exposure in digital imaging and the like|
US7538801B2|2003-09-15|2009-05-26|Micron Technology, Inc.|Region-based auto gain control and auto exposure control method and apparatus|
JP4580820B2|2005-06-02|2010-11-17|オリンパスイメージング株式会社|Image display device|
US20070024721A1|2005-07-29|2007-02-01|Rogers Sean S|Compensating for improperly exposed areas in digital images|
KR100793230B1|2006-07-24|2008-01-10|엘지전자 주식회사|Apparatus and method for compensating a partial back light in camera|
JP4305777B2|2006-11-20|2009-07-29|ソニー株式会社|Image processing apparatus, image processing method, and program|
US7800657B2|2006-12-29|2010-09-21|Micron Technology, Inc.|Method, apparatus and system using hierarchical histogram for automatic exposure adjustment of an image|
GB2464441B|2007-07-25|2012-10-17|Hiok Nam Tay|Exposure control for an imaging system|
CN101919246A|2007-08-08|2010-12-15|托尼·迈耶|Non-retro-reflective license plate imaging system|
JP4877319B2|2008-12-24|2012-02-15|カシオ計算機株式会社|Image generating apparatus, program, image display method, and imaging method|
JP5469527B2|2010-05-10|2014-04-16|パナソニック株式会社|Imaging device|
JP2013005017A|2011-06-13|2013-01-07|Sony Corp|Image pickup apparatus, image pickup apparatus control method, and program|
JP5950196B2|2011-08-30|2016-07-13|株式会社リコー|Imaging apparatus, and image analysis apparatus and moving apparatus using the same|
JP2013197892A|2012-03-19|2013-09-30|Fujitsu Ltd|Object recognition apparatus, object recognition method, and computer program for object recognition|
CN104702849B|2014-01-10|2018-03-30|Hangzhou Hikvision Digital Technology Co., Ltd.|A kind of thermal camera and its infrared lamp luminance regulating method|
CN110708445B|2014-01-10|2021-11-23|威智伦公司|Camera housing for reducing internal reflections|
JP6345500B2|2014-06-20|2018-06-20|株式会社ソシオネクスト|Digital camera exposure control method and digital camera|
US9894284B2|2014-07-18|2018-02-13|Sony Semiconductor Solutions Corporation|Imaging control device, imaging apparatus, and imaging control method|
CN107454344B|2015-06-15|2019-05-24|Oppo广东移动通信有限公司|A kind of acquisition parameters setting method and user terminal|
CN104954700B|2015-06-17|2018-09-14|浙江宇视科技有限公司|A kind of overexposure control method and device|
TWI581632B|2016-06-23|2017-05-01|國立交通大學|Image generating method and image capturing device|
KR20180078961A|2016-12-30|2018-07-10|삼성전자주식회사|Image capturing device and methos of operating the same|
TW201822709A|2016-12-30|2018-07-01|曦威科技股份有限公司|Real-time heart rate detection method and real-time heart rate detection system therefor|
CN110476416B|2017-01-26|2021-08-17|菲力尔系统公司|System and method for infrared imaging in multiple imaging modes|
US11238281B1|2017-02-27|2022-02-01|Amazon Technologies, Inc.|Light source detection in field of view|
CN108668088A|2017-04-02|2018-10-16|田雪松|Sense image formation control method and device|
WO2018226437A1|2017-06-05|2018-12-13|Adasky, Ltd.|Shutterless far infraredcamera for automotive safety and driving systems|
GB2576459B|2017-06-05|2022-03-16|Avigilon Corp|Spherical camera|
US10699386B2|2017-06-05|2020-06-30|Adasky, Ltd.|Techniques for scene-based nonuniformity correction in shutterless FIR cameras|
US11012594B2|2017-06-05|2021-05-18|Adasky, Ltd.|Techniques for correcting oversaturated pixels in shutterless FIR cameras|
US10929955B2|2017-06-05|2021-02-23|Adasky, Ltd.|Scene-based nonuniformity correction using a convolutional recurrent neural network|
US10511793B2|2017-06-05|2019-12-17|Adasky, Ltd.|Techniques for correcting fixed pattern noise in shutterless FIR cameras|
US10491808B1|2017-06-27|2019-11-26|Amazon Technologies, Inc.|Detecting sunlight in images|
RU2667790C1|2017-09-01|2018-09-24|Самсунг Электроникс Ко., Лтд.|Method of automatic adjustment of exposition for infrared camera and user computer device using this method|
TWI697867B|2018-12-12|2020-07-01|晶睿通訊股份有限公司|Metering compensation method and related monitoring camera apparatus|
EP3709628B1|2019-03-14|2020-12-23|Axis AB|Control of an illuminator|
Legal status:
2018-12-04| B06F| Objections, documents and/or translations needed after an examination request according [chapter 6.6 patent gazette]|
2020-02-27| B06U| Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]|
2021-08-24| B09A| Decision: intention to grant [chapter 9.1 patent gazette]|
2021-09-14| B16A| Patent or certificate of addition of invention granted [chapter 16.1 patent gazette]|Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 28/01/2013, SUBJECT TO THE LEGAL CONDITIONS. |
Priority:
Application number | Filing date | Patent title
US201361752889P| true| 2013-01-15|2013-01-15|
US61/752,889|2013-01-15|
PCT/CA2013/050060|WO2014110654A1|2013-01-15|2013-01-28|Imaging apparatus with scene adaptive auto exposure compensation|